
    CDM Implementation in Domestic Energy Sector : Indian Scenario

    Since the Kyoto Protocol agreement, the Clean Development Mechanism (CDM) has received considerable emphasis in terms of certified emission reductions (CERs), not only in the global carbon market but also in India. This paper attempts to assess the impact of CDM on sustainable development, particularly in the rural domestic cooking sector. A detailed survey was undertaken in the state of Kerala, in southern India, to evaluate the rural domestic energy consumption pattern. The data collected were analyzed with standard statistical software, yielding insight into the interrelationships of the various parameters that influence domestic cooking energy consumption. The analysis facilitates assessing the feasibility of CDM projects in the sector and the related prospects in the Indian scenario.

    Coupled simulation-optimization model for coastal aquifer management using genetic programming-based ensemble surrogate models and multiple-realization optimization

    Approximation surrogates are used to substitute the numerical simulation model within optimization algorithms in order to reduce the computational burden of the coupled simulation-optimization methodology. The practical utility of surrogate-based simulation-optimization has been limited mainly by the uncertainty in surrogate model simulations. We develop a surrogate-based coupled simulation-optimization methodology for deriving optimal extraction strategies for coastal aquifer management that accounts for the predictive uncertainty of the surrogate model. Optimization models considering two conflicting objectives are solved using a multiobjective genetic algorithm: maximizing the pumping from production wells and minimizing the barrier well pumping required for hydraulic control of saltwater intrusion. The density-dependent flow and transport simulation model FEMWATER is used to generate input-output patterns of groundwater extraction rates and the resulting salinity levels. The nonparametric bootstrap method is used to generate different realizations of this data set, and these realizations are used to train different surrogate models, based on genetic programming, for predicting salinity intrusion in coastal aquifers. The predictive uncertainty of these surrogate models is quantified, and an ensemble of surrogate models is used in a multiple-realization optimization model to derive the optimal extraction strategies; the multiple realizations refer to the salinity predictions of the different surrogate models in the ensemble. Optimal solutions are obtained for different reliability levels of the surrogate models and are compared against the solutions obtained using a chance-constrained optimization formulation and a single-surrogate-based model. The ensemble-based approach is found to provide reliable solutions for coastal aquifer management while retaining the advantage of surrogate models in reducing computational burden.
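
    A minimal sketch of the multiple-realization idea is given below, under illustrative assumptions: bootstrap resamples of the simulator's input-output patterns train an ensemble of surrogates, and a candidate extraction strategy is kept only if a chosen fraction of ensemble members (the reliability level) predicts salinity below the allowable limit. A generic scikit-learn regressor stands in for the genetic programming surrogate, and the data, salinity limit and reliability value are placeholders, not values from the study.

```python
# Hedged sketch of the multiple-realization idea: bootstrap resamples of
# simulator input-output data train an ensemble of surrogates, and a candidate
# pumping strategy is accepted only if a chosen fraction (reliability level)
# of ensemble members predicts salinity below the allowable limit.
# Assumption: a plain scikit-learn regressor stands in for the genetic
# programming surrogate; data and limits are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Illustrative FEMWATER-style patterns: extraction rates -> salinity at a monitor well.
X = rng.uniform(0.0, 100.0, size=(200, 4))          # extraction rates (m^3/day)
y = 0.02 * X[:, 0] + 0.015 * X[:, 1] - 0.01 * X[:, 2] + rng.normal(0, 0.1, 200)

def train_bootstrap_ensemble(X, y, n_members=20):
    """Train one surrogate per nonparametric bootstrap realization of the data."""
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))   # resample with replacement
        m = RandomForestRegressor(n_estimators=50, random_state=0)
        m.fit(X[idx], y[idx])
        members.append(m)
    return members

def feasible(ensemble, pumping, salinity_limit=1.5, reliability=0.9):
    """Feasible if >= `reliability` of members keep predicted salinity below the limit."""
    preds = np.array([m.predict(pumping.reshape(1, -1))[0] for m in ensemble])
    return np.mean(preds <= salinity_limit) >= reliability

ensemble = train_bootstrap_ensemble(X, y)
candidate = np.array([40.0, 30.0, 60.0, 20.0])       # trial extraction strategy
print("feasible at 90% reliability:", feasible(ensemble, candidate))
```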

    Genetic Programming: Efficient Modeling Tool in Hydrology and Groundwater Management

    [Extract] With the advent of computers, a wide range of mathematical and numerical models have been developed with the intent of predicting or approximating parts of the hydrologic cycle. Before the advent of conceptual, process-based models, physical hydraulic models, which are reduced-scale representations of large hydraulic systems, were commonly used in water resources engineering. Rapid development in computational systems and in the numerical solution of complex differential equations enabled the development of conceptual models to represent physical systems. Thus, in the last two decades a large number of mathematical models have been developed to represent different processes in the hydrological cycle.

    COST EFFECTIVENESS OF RECYCLING: A SYSTEMS MODEL

    Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and that in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Some more empirically based studies have also found that recycling is more expensive than disposal, while other work, through both models and surveys, has found otherwise. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in the distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partly because of assumed differences in tip fees for recyclables and disposed wastes, and partly because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable on the basis of cost savings alone, rather than on harder-to-measure factors that may not affect program budgets.
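
    The cost mechanism the model exploits can be illustrated with a deliberately simple calculation, sketched below: total system cost is split into collection and tip-fee components, and disposed tonnage is displaced by recycled tonnage as the recycling fraction grows. All unit costs and tonnages are hypothetical placeholders, not figures from the study.

```python
# Minimal, hypothetical cost comparison in the spirit of the abstract: total
# system cost as disposal tonnage is displaced by curbside recycling tonnage.
# Every unit cost and tonnage below is an illustrative assumption.
def system_cost(total_tons, recycle_frac,
                disposal_tip=85.0,      # $/ton tip fee for disposed waste
                recycle_tip=35.0,       # $/ton processing fee for recyclables
                disposal_collect=60.0,  # $/ton curbside collection, refuse
                recycle_collect=70.0):  # $/ton curbside collection, recyclables
    recycled = total_tons * recycle_frac
    disposed = total_tons - recycled
    return (disposed * (disposal_collect + disposal_tip)
            + recycled * (recycle_collect + recycle_tip))

base = system_cost(50_000, 0.00)
for frac in (0.10, 0.25, 0.35):
    cost = system_cost(50_000, frac)
    print(f"recycling {frac:.0%}: ${cost:,.0f}  (savings ${base - cost:,.0f})")
```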

    Second order hydrodynamics based on effective kinetic theory and electromagnetic signals from QGP

    We study thermal particle production from relativistic heavy ion collisions in the presence of viscosities by employing the recently developed second order dissipative hydrodynamic formulation estimated within a quasiparticle description of the thermal QCD (Quantum Chromodynamics) medium. The sensitivity of the shear and bulk viscous pressures to the temperature dependence of the relaxation time is analyzed within a one-dimensional boost-invariant expansion of the quark gluon plasma (QGP). The dissipative corrections to the phase-space distribution functions are obtained from the Chapman-Enskog-like iterative solution of the effective Boltzmann equation in the relaxation time approximation. Thermal dilepton and photon production rates for the QGP are calculated by employing this viscosity-modified distribution function. Particle emission yields are quantified for the longitudinal expansion of the QGP with different temperature-dependent relaxation times. Our analysis employing this second order hydrodynamic model indicates that the particle spectra are enhanced by both bulk and shear viscosities and remain well behaved. The particle yields are also found to be sensitive to the relaxation time.
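
    For orientation, the relaxation-time-approximation kinetic equation that underlies such Chapman-Enskog-like iterative corrections is commonly written as follows; this is the standard textbook form, not an expression reproduced from the paper.

```latex
% Standard relaxation-time-approximation (RTA) Boltzmann equation and the
% first Chapman-Enskog iterate; a textbook form, not taken from the paper.
\[
  p^{\mu}\partial_{\mu} f \;=\; -\,\frac{u\cdot p}{\tau_{R}}\,\bigl(f - f_{\mathrm{eq}}\bigr),
  \qquad
  f \simeq f_{\mathrm{eq}} + \delta f,\quad
  \delta f \;=\; -\,\frac{\tau_{R}}{u\cdot p}\, p^{\mu}\partial_{\mu} f_{\mathrm{eq}},
\]
where $\tau_{R}$ is the (possibly temperature-dependent) relaxation time and
$u^{\mu}$ the fluid four-velocity; $\delta f$ is the viscous correction that
enters the dilepton and photon emission rates.
```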

    FORMULATION DEVELOPMENT AND IN VITRO EVALUATION OF SUSTAINED RELEASE MATRIX TABLETS OF BOSENTAN BY USING SYNTHETIC POLYMERS

    Objective: Bosentan is an endothelin receptor antagonist (ERA) indicated for the treatment of pulmonary arterial hypertension (PAH). The aim of the present study was to develop sustained-release matrix tablets of bosentan in order to release the drug in a sustained and predictable manner. Methods: Bosentan SR matrix tablets were prepared by the wet granulation method. The tablets were evaluated for hardness, thickness, friability and drug content, and were subjected to 12-hour in vitro drug release studies. Results: The amount of bosentan released from the tablet formulations at different time intervals was estimated by a UV spectroscopy method. The formulations were prepared using different polymers, such as HPMC K4M and HPMC K15M, at different ratios. Conclusion: Among the ten formulations, formulation F-2, containing drug and HPMC K4M in a 1:0.5 ratio, was optimized based on its ability to sustain drug release over the 12-hour dissolution study. The results of the study clearly demonstrate that the HPMC matrix tablet formulation is an effective and promising drug delivery system for once-daily administration of bosentan.

    Scene Segmentation and Classification

    In this thesis work we propose a novel method for video segmentation and classification, which are important tasks in the indexing and retrieval of videos. Video indexing techniques require the video to be segmented effectively into smaller meaningful units called shots. Because of the huge volume of digital data and its dimensionality, indexing the data at the shot level is a difficult task. Scene classification has become a challenging and important problem in recent years because of its efficiency in video indexing. The main issue in video segmentation is the selection of features that are robust to spurious illumination changes and object motion. A shot boundary detection algorithm is proposed that detects both abrupt and gradual transitions simultaneously. Each shot is represented by one or more key-frames; a key-frame is a still image of a shot, or a cumulative histogram representation, that best represents the content of the shot. This research work presents a new method for segmenting videos into scenes. A scene is defined as a sequence of shots that are semantically correlated; shots from a scene have similar color content and background information. The similarity between a pair of shots is the color histogram intersection of the key frames of the two shots, where histogram intersection outputs the count of pixels with similar colors in the two frames. A shot similarity matrix of 0s and 1s is computed, giving the similarity between any two shots: two shots belong to the same scene if their similarity is 1, and to different scenes otherwise. A spectral clustering algorithm is used to identify scene boundaries, with the shots belonging to a scene forming a cluster. In the proposed scene detection method, similar shots form nodes connected by edges, where an edge represents a similarity value of 1 between two shots. An SVM classifier is used for scene classification. The experimental results on different data sets show that the proposed algorithms can effectively segment and classify digital videos. Key words: content-based video retrieval, video content analysis, video indexing, shot boundary detection, key-frames, scene segmentation, video classification
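
    The shot-similarity step described above lends itself to a compact sketch, given below under assumed parameters: normalized color histograms are computed per key frame, histogram intersection measures shared color content, and a threshold turns the result into the binary shot similarity matrix. The 8-bin-per-channel histograms and the 0.7 threshold are illustrative choices, not values from the thesis.

```python
# Minimal sketch of the shot-similarity computation: color-histogram
# intersection between key frames, thresholded into a binary similarity
# matrix. The bin count and threshold are illustrative assumptions.
import numpy as np

def color_histogram(frame, bins=8):
    """Normalized joint RGB histogram of an HxWx3 uint8 frame."""
    hist, _ = np.histogramdd(frame.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256),) * 3)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Fraction of pixel mass shared by two normalized histograms."""
    return np.minimum(h1, h2).sum()

def shot_similarity_matrix(key_frames, threshold=0.7):
    """Binary matrix: 1 if two shots' key frames share enough color content."""
    hists = [color_histogram(f) for f in key_frames]
    n = len(hists)
    sim = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            sim[i, j] = int(histogram_intersection(hists[i], hists[j]) >= threshold)
    return sim

# Toy usage: two dark and two bright random "key frames" give a 2x2 block structure.
rng = np.random.default_rng(1)
dark = [rng.integers(0, 128, size=(120, 160, 3), dtype=np.uint8) for _ in range(2)]
bright = [rng.integers(128, 256, size=(120, 160, 3), dtype=np.uint8) for _ in range(2)]
print(shot_similarity_matrix(dark + bright))
```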

    Bulk viscosity in hyperonic star and r-mode instability

    We consider a rotating neutron star with hyperons present in its core, using an equation of state from an effective chiral model within the relativistic mean field approximation. We calculate the hyperonic bulk viscosity coefficient due to nonleptonic weak interactions. By estimating the damping timescales of the dissipative processes, we investigate its role in the suppression of gravitationally driven instabilities in the r-mode. We observe that the r-mode instability remains very significant for hyperon core temperatures of around 10^8 K, resulting in a comparatively larger instability window. We find that such instability can reduce the angular velocity of the rapidly rotating star considerably, down to about 0.04 Ω_K, where Ω_K is the Keplerian angular velocity.
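
    As background, the competition between gravitational-radiation driving and viscous damping referred to here is usually summarized through a net timescale; the standard form is sketched below and is not taken from the paper.

```latex
% Standard r-mode timescale balance (textbook form, not from the paper):
% the mode is unstable where the gravitational-radiation growth rate
% exceeds the combined viscous damping rates.
\[
  \frac{1}{\tau(\Omega, T)}
    \;=\; -\,\frac{1}{\tau_{\mathrm{GR}}(\Omega)}
          \;+\; \frac{1}{\tau_{\mathrm{BV}}(\Omega, T)}
          \;+\; \frac{1}{\tau_{\mathrm{SV}}(\Omega, T)},
\]
with $\tau_{\mathrm{GR}}$ the gravitational-radiation growth timescale and
$\tau_{\mathrm{BV}}$, $\tau_{\mathrm{SV}}$ the bulk- and shear-viscosity damping
timescales; the critical curve $1/\tau = 0$ bounds the instability window.
```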

    Visualization Of Supersonic Flows In Shock Tunnels, Using The Background Oriented Schlieren Technique

    Visualisation of supersonic compressible flows using the Background Oriented Schlieren (BOS) technique is presented. Results are presented from experiments carried out in a reflected shock tunnel with models of a 20-degree semi-vertex-angle circular cone and a re-entry body in the test section. The technique uses a simple optical set-up consisting of a structured background pattern, an electronic camera with a high shutter speed and a high-intensity light source. Tests were conducted with a Mach 4 conical nozzle, at a nozzle supply pressure of 2 MPa and a nozzle supply temperature of 2000 K. The images captured during the test were compared using a PIV-style image-processing code. The intensity of light at each point in the processed image is proportional to the density at that point. Qualitative visualization of shock shapes was achieved, with images clearly indicating regions of subsonic and supersonic flow. For the cone, the shock angle measured from the BOS image agreed with theoretical calculations to within 0.5 degrees. Shock standoff distances could be measured from the BOS image for the re-entry body.
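
    A minimal PIV-style processing sketch for BOS is given below: the reference (no-flow) and test (with-flow) background images are divided into interrogation windows, and for each window the pixel shift that maximizes the correlation is taken as the apparent background displacement caused by density gradients. The window size and search radius are illustrative assumptions, not parameters from the experiments.

```python
# Hedged sketch of PIV-style processing for BOS: integer-pixel window
# matching between a no-flow reference image and a with-flow test image.
# Window size and search radius are illustrative choices.
import numpy as np

def window_shift(ref_win, test_img, top, left, search=4):
    """Best integer shift (dy, dx) of one window within +/- `search` pixels."""
    h, w = ref_win.shape
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > test_img.shape[0] or x + w > test_img.shape[1]:
                continue
            cand = test_img[y:y + h, x:x + w]
            score = np.sum((ref_win - ref_win.mean()) * (cand - cand.mean()))
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

def bos_displacement_field(ref_img, test_img, win=32, search=4):
    """Map of apparent background shifts over a grid of interrogation windows."""
    rows = range(0, ref_img.shape[0] - win + 1, win)
    cols = range(0, ref_img.shape[1] - win + 1, win)
    field = np.zeros((len(rows), len(cols), 2))
    for i, top in enumerate(rows):
        for j, left in enumerate(cols):
            field[i, j] = window_shift(ref_img[top:top + win, left:left + win],
                                       test_img, top, left, search)
    return field

# Toy usage: a synthetic 2-pixel horizontal shift is recovered for an interior window.
rng = np.random.default_rng(2)
ref = rng.random((128, 128))
test = np.roll(ref, 2, axis=1)
print(bos_displacement_field(ref, test)[1, 1])   # expected [0, 2]
```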

    Design of Low Power and Area Efficient Carry Select Adder (CSLA) using Verilog Language

    The carry select method has been deemed a good compromise between cost and performance in carry-propagation adder design. However, the conventional carry select adder (CSLA) is still area-consuming because of its dual ripple carry adder structure. The excessive area overhead makes the conventional CSLA relatively unattractive, but this has been circumvented by the use of an add-one circuit. In this paper, an area-efficient modified CSLA scheme based on new first-zero detection logic is proposed. The gate count of the 32-bit modified CSLA can be greatly reduced. The design proposed in this paper has been developed in the Verilog language and synthesized using Xilinx 13.2.
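
    For reference, the duplicated-adder structure that makes the conventional CSLA area-consuming can be sketched behaviorally, as below: each block computes two ripple-carry sums, one assuming carry-in 0 and one assuming carry-in 1, and a multiplexer selects between them once the actual carry arrives. This Python model illustrates the conventional scheme only, not the paper's modified add-one or first-zero detection design; the block and word widths are arbitrary.

```python
# Behavioral model of a conventional carry select adder: each block precomputes
# two ripple-carry sums (carry-in 0 and carry-in 1) and muxes on the real carry.
def ripple_add(a_bits, b_bits, carry_in):
    """Ripple-carry add over little-endian bit lists; returns (sum_bits, carry_out)."""
    s, c = [], carry_in
    for a, b in zip(a_bits, b_bits):
        s.append(a ^ b ^ c)
        c = (a & b) | (c & (a ^ b))
    return s, c

def carry_select_add(a, b, width=16, block=4):
    a_bits = [(a >> i) & 1 for i in range(width)]
    b_bits = [(b >> i) & 1 for i in range(width)]
    out, carry = [], 0
    for lo in range(0, width, block):
        blk_a, blk_b = a_bits[lo:lo + block], b_bits[lo:lo + block]
        sum0, c0 = ripple_add(blk_a, blk_b, 0)   # precomputed for carry-in = 0
        sum1, c1 = ripple_add(blk_a, blk_b, 1)   # precomputed for carry-in = 1
        out += sum1 if carry else sum0           # mux on the actual incoming carry
        carry = c1 if carry else c0
    return sum(bit << i for i, bit in enumerate(out)) + (carry << width)

assert carry_select_add(0xBEEF, 0x1234) == 0xBEEF + 0x1234
print(hex(carry_select_add(0xBEEF, 0x1234)))
```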